In probability theory, and in particular information theory, the '''conditional mutual information''' is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.

==Definition==
For discrete random variables <math>X</math>, <math>Y</math>, and <math>Z</math>, we define

:<math>I(X;Y|Z) = \sum_{z} p_Z(z) \sum_{y} \sum_{x} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\, p_{Y|Z}(y|z)},</math>

where the marginal, joint, and conditional probability mass functions are denoted by <math>p</math> with the appropriate subscript. This can be simplified as

:<math>I(X;Y|Z) = \sum_{z} \sum_{y} \sum_{x} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)}.</math>

Alternatively, we may write〔K. Makarychev et al., "A new class of non-Shannon-type inequalities for entropies", ''Communications in Information and Systems'', Vol. 2, No. 2, pp. 147–166, December 2002.〕 it in terms of joint and conditional entropies as

:<math>I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z).</math>

This can be rewritten to show its relationship to mutual information,

:<math>I(X;Y|Z) = I(X;Y,Z) - I(X;Z),</math>

usually rearranged as the chain rule for mutual information,

:<math>I(X;Y,Z) = I(X;Z) + I(X;Y|Z).</math>

Another equivalent form of the above is

:<math>I(X;Y|Z) = H(X|Z) - H(X|Y,Z).</math>

Conditioning on a third random variable may either increase or decrease the mutual information: that is, the difference <math>I(X;Y|Z) - I(X;Y)</math>, called the interaction information, may be positive, negative, or zero, but it is always true that

:<math>I(X;Y|Z) \ge 0</math>

for discrete, jointly distributed random variables ''X'', ''Y'', ''Z''. This result has been used as a basic building block for proving other inequalities in information theory, in particular those known as Shannon-type inequalities.

Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:

:<math>I(X;Y|Z) = D_{\mathrm{KL}}\!\left( p(X,Y,Z) \,\|\, p(X|Z)\, p(Y|Z)\, p(Z) \right).</math>
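The simplified triple-sum formula above translates directly into code. The following is a minimal sketch (not part of the original article; the function name and the 3-dimensional-array representation of the joint probability mass function are illustrative assumptions) that computes <math>I(X;Y|Z)</math> in bits. The example at the end takes independent uniform bits <math>X</math> and <math>Z</math> with <math>Y = X \oplus Z</math>, for which <math>I(X;Y) = 0</math> but <math>I(X;Y|Z) = 1</math> bit, illustrating that conditioning can increase mutual information.

<syntaxhighlight lang="python">
import numpy as np

def conditional_mutual_information(p_xyz):
    """Return I(X;Y|Z) in bits from a joint PMF stored as p_xyz[x, y, z]."""
    p_xyz = np.asarray(p_xyz, dtype=float)
    p_z = p_xyz.sum(axis=(0, 1))   # marginal p(z)
    p_xz = p_xyz.sum(axis=1)       # marginal p(x, z)
    p_yz = p_xyz.sum(axis=0)       # marginal p(y, z)

    cmi = 0.0
    for x, y, z in np.ndindex(p_xyz.shape):
        p = p_xyz[x, y, z]
        if p > 0.0:
            # Summand of the simplified formula:
            # p(x,y,z) * log2( p(z) p(x,y,z) / (p(x,z) p(y,z)) )
            cmi += p * np.log2(p_z[z] * p / (p_xz[x, z] * p_yz[y, z]))
    return cmi

# Example: X, Z independent uniform bits, Y = X XOR Z.
# X and Y alone are independent, but given Z each determines the other,
# so I(X;Y) = 0 while I(X;Y|Z) = 1 bit.
p = np.zeros((2, 2, 2))
for x in range(2):
    for z in range(2):
        p[x, x ^ z, z] = 0.25

print(conditional_mutual_information(p))  # prints 1.0
</syntaxhighlight>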